Progressive Operational Perceptrons

Authors

  • Serkan Kiranyaz
  • Turker Ince
  • Alexandros Iosifidis
  • Moncef Gabbouj
Abstract

There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully-connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a generalized model of the biological neuron and, ultimately, superior diversity. We modify conventional back-propagation (BP) to train GOPs and, furthermore, propose Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and in this paper we show that this ability enables POPs of minimal network depth to tackle the most challenging learning problems, problems that cannot be learned by conventional ANNs even with deeper and significantly more complex configurations. Experimental results show that POPs scale up very well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain.

Similar Resources

Synthetic Neuron Implementations

Many different synthetic neuron implementations exist that include a variety of traits associated with biological neurons and our understanding of them. An important motivation behind the study, modelling, and implementation of different synthetic neurons is that nature has provided the most efficient ways of performing important types of computation, which we are trying to mimic. Whether it is...

Memorandum CaSaR 92-25: Exact Classification with Two-Layered Perceptrons

We study the capabilities of two-layered perceptrons for exactly classifying a given subset. Both necessary and sufficient conditions are derived for subsets to be exactly classifiable with two-layered perceptrons that use the hard-limiting response function. The necessary conditions can be viewed as generalizations of the linear-separability condition of one-layered perceptrons and confirm the...

A Pilot Sampling Method for Multi-layer Perceptrons

As the size of samples grows, the accuracy of trained multi-layer perceptrons improves, with some reduction in error rates. But we cannot use larger and larger samples, because the computational complexity of training multi-layer perceptrons becomes enormous and data overfitting can occur. This paper suggests an effective approach to determining a proper sample size for multi-layer perceptr...

A note on: Optimal ordering policy for stock-dependent demand under progressive payment scheme

In a recent paper, Soni and Shah [Soni, H., Shah, N. H., 2008. Optimal ordering policy for stock-dependent demand under progressive payment scheme. European Journal of Operational Research 184, 91-100] developed a model to find the optimal ordering policy for a retailer with stock-dependent demand and a supplier that offers a progressive payment scheme to the retailer. This note corrects some e...

Bounds on the Degree of High Order Binary Perceptrons

High order perceptrons are often used to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is then shifted into the complexity of the functions performed by each high order unit, and in particular into the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high order perceptron,...

Journal:
  • Neurocomputing

Volume 224, Issue –

Pages –

Publication date: 2017